
@flemx flemx commented Nov 20, 2025

Description:

Include reasoning and thinking blocks inside additional_kwargs on the AIMessage when using reasoning models through custom LLM gateway proxies.

This makes the OpenAI chat model compatible with OpenAI-compatible LLM gateways such as the LiteLLM proxy when used with reasoning and thinking models like Anthropic and DeepSeek:
https://docs.litellm.ai/docs/reasoning_content

Issue:
Fixes #30530

Dependencies:

  • needs httpx in the test dependencies of the openai partner library so the integration test can ignore self-signed certificates when using custom proxies

Lint and test

  • make format — passed
  • make lint — passed
  • make test — passed
  • make integration_tests — passed

@flemx flemx requested review from ccurme and mdrxy as code owners November 20, 2025 15:24
@github-actions github-actions bot added the integration (Related to a provider partner package integration), dependencies (Pull requests that update a dependency file), and openai labels Nov 20, 2025
@flemx flemx changed the title from "Fix(openai): Include support for OpenAI chatmodel reasoning blocks using LiteLLM proxy" to "feat(openai): Include support for OpenAI chatmodel reasoning blocks using LiteLLM proxy" Nov 20, 2025

codspeed-hq bot commented Nov 20, 2025

CodSpeed Performance Report

Merging #34049 will not alter performance

Comparing flemx:flemx/openai-reasoning-blocks (c9eb084) with master (525d5c0) [1]

Summary

✅ 6 untouched
⏩ 28 skipped [2]

Footnotes

  1. No successful run was found on master (ee3373a) during the generation of this report, so 525d5c0 was used instead as the comparison base. There might be some changes unrelated to this pull request in this report.

  2. 28 benchmarks were skipped, so the baseline results were used instead. If they were deleted from the codebase, archive them to remove them from the performance reports.

@github-actions github-actions bot added feature and removed feature labels Nov 20, 2025

flemx commented Nov 21, 2025

TODO:

  • Also need to include them inside the streaming chunks
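The streaming side of that TODO could look like the sketch below, assuming each chunk delta may carry a partial reasoning_content alongside partial content (shapes assumed from the LiteLLM docs, not taken from this PR's diff):

```python
# Hedged sketch of the TODO: while streaming, each delta from an
# OpenAI-compatible gateway may carry a partial `reasoning_content`
# as well as partial `content`; both need to be accumulated so the
# final message exposes the full reasoning in additional_kwargs.
def accumulate_stream(deltas: list) -> dict:
    content, reasoning = "", ""
    for delta in deltas:
        content += delta.get("content") or ""
        reasoning += delta.get("reasoning_content") or ""
    message = {"content": content, "additional_kwargs": {}}
    if reasoning:
        message["additional_kwargs"]["reasoning_content"] = reasoning
    return message


deltas = [
    {"reasoning_content": "2 + 2 "},
    {"reasoning_content": "equals 4."},
    {"content": "The answer is 4."},
]
print(accumulate_stream(deltas))
```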

@flemx flemx changed the title from "feat(openai): Include support for OpenAI chatmodel reasoning blocks using LiteLLM proxy" to "feat(openai): Include support for reasoning blocks when using openai compatible llm gateways" Nov 21, 2025
@github-actions github-actions bot added feature and removed feature labels Nov 21, 2025

flemx commented Nov 21, 2025

Simplified the implementation to only add reasoning_content to additional_kwargs on the message object, making it compatible with more external OpenAI-compatible models.
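On the consumer side, the simplified approach keeps content as the plain answer and lets callers opt in to the reasoning via additional_kwargs. A hypothetical usage sketch, with a plain dict standing in for the AIMessage:

```python
# Hypothetical consumer-side view of the simplified approach: the
# message `content` stays the plain answer, while gateway-specific
# reasoning is tucked into additional_kwargs where callers can opt in.
# A plain dict is used here rather than a real AIMessage instance.
message = {
    "content": "The answer is 4.",
    "additional_kwargs": {"reasoning_content": "2 + 2 equals 4."},
}

reasoning = message["additional_kwargs"].get("reasoning_content")
if reasoning:
    print("model reasoning:", reasoning)
print("answer:", message["content"])
```

Because unknown gateways may omit the field entirely, readers should use `.get()` rather than indexing, as above.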


Labels

  • dependencies (Pull requests that update a dependency file)
  • feature
  • integration (Related to a provider partner package integration)
  • openai


Development

Successfully merging this pull request may close these issues.

openai: thinking_block field returned by Claude model ignored by ChatOpenAI().invoke()
